Intro to generalization; Chernoff method
Abstract
Define the empirical risk R̂(f) := (1/n) Σᵢ₌₁ⁿ ℓ(f(xi), yi), and let f̄ := "argmin_{f∈F}" R(f), f̄n := "argmin_{f∈F}" R̂(f), f̂ be the output of the optimization algorithm, and ḡ := "argmin_{g measurable}" R(g). (As usual, argmin has technical issues we are avoiding, hence the quotes.) The goal in a machine learning problem is to design an algorithm whose output f̂ makes R(f̂) − R(ḡ) small. We can decompose this error into the following pieces:

R(f̂) − R(ḡ) = [R(f̂) − R̂(f̂)]     (♦)
             + [R̂(f̂) − R̂(f̄n)]   (♥)
             + [R̂(f̄n) − R̂(f̄)]   (♠)
             + [R̂(f̄) − R(f̄)]     (♦)
             + [R(f̄) − R(ḡ)]      (♣).

These terms can be controlled as follows. (This course was designed around this decomposition!)
• (♦). Third part of this course: generalization/statistics.
• (♥). Second part of this course: optimization.
• (♠). This term is ≤ 0 by the choice of f̄n.
• (♣). First part of this course: representation/approximation.

So the rest of this course is part 3, the statistical problem above (comparing R̂ and R), and then we'll leave a few lectures for some advanced/miscellaneous topics.
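As a sanity check, the decomposition above can be instantiated numerically. The sketch below uses squared loss, a toy label distribution, and a finite class of constant predictors; all of these choices (the noise level, the grid F, the assumption that the optimizer is exact) are illustrative assumptions, not part of the notes. The five terms telescope to R(f̂) − R(ḡ), and the (♠) term is nonpositive by construction.

```python
# Numerical instance of the risk decomposition
#   R(f_hat) - R(g_bar) = sum of the five terms below.
# Toy setup (illustrative assumptions): y = 0.3 + Gaussian noise,
# squared loss, and F = five constant predictors on a grid.
import numpy as np

rng = np.random.default_rng(0)
NOISE_VAR = 0.01  # variance of the label noise (illustrative)

def true_risk(pred):
    # For squared loss, E[(pred - y)^2] = (pred - 0.3)^2 + NOISE_VAR,
    # so the best measurable predictor is g_bar = 0.3.
    return (pred - 0.3) ** 2 + NOISE_VAR

n = 200
y = 0.3 + rng.normal(0.0, NOISE_VAR ** 0.5, size=n)

def emp_risk(pred):
    # R_hat(pred) = (1/n) * sum_i (pred - y_i)^2
    return float(np.mean((pred - y) ** 2))

F = np.linspace(0.0, 1.0, 5)          # finite hypothesis class
g_bar = 0.3                           # best measurable predictor
f_bar = min(F, key=true_risk)         # "argmin_{f in F} R(f)"
f_bar_n = min(F, key=emp_risk)        # "argmin_{f in F} R_hat(f)"
f_hat = f_bar_n                       # pretend optimization is exact

terms = [
    true_risk(f_hat) - emp_risk(f_hat),    # (diamond) generalization
    emp_risk(f_hat) - emp_risk(f_bar_n),   # (heart)   optimization
    emp_risk(f_bar_n) - emp_risk(f_bar),   # (spade)   <= 0 by choice of f_bar_n
    emp_risk(f_bar) - true_risk(f_bar),    # (diamond) generalization
    true_risk(f_bar) - true_risk(g_bar),   # (club)    approximation
]
```

Note that the (♠) term is the only one controlled by pure algebra; the others need statistics (parts ♦), optimization (♥), or approximation theory (♣).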
Similar resources
Lecture 03: Chernoff Bounds and Intro to Spectral Graph Theory, 1.1 Hoeffding's Inequality
Chernoff type bounds for sum of dependent random variables and applications in additive number theory
We present generalizations of Chernoff's large deviation bound for sums of dependent random variables. These generalizations seem to be very useful for the Erdős probabilistic method. As an illustrative application, we sketch the solution of an old problem of Nathanson [14] concerning thin Waring bases.
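For reference, the classical independent-case bound being generalized here, in its Hoeffding form, states that for Sn a sum of n independent Bernoulli(p) variables, P(Sn/n − p ≥ t) ≤ exp(−2nt²). A minimal Monte Carlo sanity check of the independent case (parameters p, n, t, and the trial count are arbitrary illustrative choices):

```python
# Empirical check of the Hoeffding/Chernoff tail bound for a sum of
# independent Bernoulli(p) variables: P(S_n/n - p >= t) <= exp(-2*n*t^2).
# All parameters below are illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
p, n, t, trials = 0.5, 100, 0.1, 20_000

means = rng.binomial(n, p, size=trials) / n        # S_n / n for each trial
empirical_tail = float(np.mean(means - p >= t))    # Monte Carlo tail estimate
hoeffding_bound = float(np.exp(-2 * n * t ** 2))   # exp(-2), about 0.135
```

The bound is loose here (the true tail is roughly an order of magnitude smaller), which is typical: Chernoff-type bounds trade tightness for exponential decay in n.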
Improving Chernoff criterion for classification by using the filled function
Linear discriminant analysis is a well-known matrix-based dimensionality reduction method. It is a supervised feature extraction method used in two-class classification problems. However, it is incapable of dealing with data in which classes have unequal covariance matrices. To address this issue, the Chernoff distance is an appropriate criterion for measuring distances between distributions. In the p...
Derandomizing the Ahlswede-Winter matrix-valued Chernoff bound using pessimistic estimators, and applications
Ahlswede and Winter [IEEE Trans. Inf. Th. 2002] introduced a Chernoff bound for matrix-valued random variables, which is a non-trivial generalization of the usual Chernoff bound for real-valued random variables. We present an efficient derandomization of their bound using the method of pessimistic estimators (see Raghavan [JCSS 1988]). As a consequence, we derandomize an efficient construction ...
Derandomizing the AW matrix-valued Chernoff bound using pessimistic estimators and applications
Ahlswede and Winter [AW02] introduced a Chernoff bound for matrix-valued random variables, which is a non-trivial generalization of the usual Chernoff bound for real-valued random variables. We present an efficient derandomization of their bound using the method of pessimistic estimators (see Raghavan [Rag88]). As a consequence, we derandomize a construction of Alon and Roichman [AR94] (see als...
Publication date: 2016